Llama 4 Scout 17B 16E Instruct
Llama 4 Scout is a natively multimodal large language model introduced by Meta. It uses a Mixture of Experts (MoE) architecture with 17 billion activated parameters and 16 routed experts (roughly 109 billion total parameters), routing each token to one routed expert alongside a shared expert. The instruct-tuned model supports text in 12 languages as well as image understanding.
Large Language Model · Transformers · Supports Multiple Languages
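
Since the listing tags the model as usable with Transformers, a short loading sketch may be helpful. This is a hedged example, not an official snippet: it assumes transformers >= 4.51 (the first release with Llama 4 support), access to the gated meta-llama/Llama-4-Scout-17B-16E-Instruct checkpoint, enough GPU memory to shard the MoE weights, and a placeholder image URL.

```python
# Minimal sketch: text + image chat with Llama 4 Scout via Transformers.
# Assumes transformers >= 4.51 and accepted access to the gated checkpoint.
import torch
from transformers import AutoProcessor, Llama4ForConditionalGeneration

model_id = "meta-llama/Llama-4-Scout-17B-16E-Instruct"

processor = AutoProcessor.from_pretrained(model_id)
model = Llama4ForConditionalGeneration.from_pretrained(
    model_id,
    device_map="auto",           # shard weights across available GPUs
    torch_dtype=torch.bfloat16,  # bf16 keeps memory use manageable
)

# Mixed text + image prompt; the chat template handles both modalities.
messages = [
    {
        "role": "user",
        "content": [
            {"type": "image", "url": "https://example.com/sample.png"},  # placeholder URL
            {"type": "text", "text": "Describe this image in one sentence."},
        ],
    }
]

inputs = processor.apply_chat_template(
    messages,
    add_generation_prompt=True,
    tokenize=True,
    return_dict=True,
    return_tensors="pt",
).to(model.device)

outputs = model.generate(**inputs, max_new_tokens=128)

# Decode only the newly generated tokens.
print(processor.batch_decode(
    outputs[:, inputs["input_ids"].shape[-1]:], skip_special_tokens=True
)[0])
```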